Entropy and Its Discontents: A Note on Definitions
Author
Abstract
The routine definitions of Shannon entropy for both discrete and continuous probability laws show inconsistencies that make them not mutually coherent. We propose a few possible modifications of these quantities so that (1) they no longer show these incongruities, and (2) they go one into the other in a suitable limit as the result of a renormalization. The properties of the new quantities would differ slightly from those of the usual entropies in a few other respects.

PACS: 02.50.Cw, 05.45.Tp
MSC: 94A17, 54C70
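To make the incoherence concrete with a standard example: quantizing a continuous law on bins of width Δ gives a discrete entropy that diverges like −log Δ rather than converging to the differential entropy; adding log Δ back is the textbook renormalization under which the two definitions meet in the limit. Below is a minimal numerical sketch of that limit, offered as an illustration of the phenomenon and not as the specific modification proposed in the paper; the helper name renormalized_entropy and the Gaussian test case are ours.

import numpy as np

# Sketch: quantize a continuous density on bins of width delta, take the
# discrete Shannon entropy of the cell probabilities, and add log(delta).
# As delta -> 0 the renormalized value approaches the differential entropy.

def renormalized_entropy(density, a, b, delta):
    edges = np.arange(a, b + delta, delta)
    centers = 0.5 * (edges[:-1] + edges[1:])
    p = density(centers) * delta          # cell probabilities (midpoint rule)
    p = p[p > 0]
    p = p / p.sum()                       # absorb small discretization error
    return -np.sum(p * np.log(p)) + np.log(delta)

gauss = lambda x: np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
for delta in (0.5, 0.1, 0.01):
    print(delta, renormalized_entropy(gauss, -10.0, 10.0, delta))

As Δ decreases, the printed values approach 0.5·log(2πe) ≈ 1.4189 nats, the differential entropy of the standard Gaussian.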
Similar resources
A Preferred Definition of Conditional Rényi Entropy
The Rényi entropy is a generalization of the Shannon entropy to a one-parameter family of entropies. Tsallis entropy, too, is a generalization of Shannon entropy; its measure is non-logarithmic. After the introduction of Shannon entropy, the conditional Shannon entropy was derived and its properties became known. Also, for Tsallis entropy, the conditional entropy was introduced a...
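For reference, the standard textbook forms of the two one-parameter families mentioned here (not taken from the cited paper) are

H_\alpha(X) = \frac{1}{1-\alpha}\,\log \sum_i p_i^{\alpha}, \qquad \alpha > 0,\ \alpha \neq 1,
\qquad \lim_{\alpha \to 1} H_\alpha(X) = -\sum_i p_i \log p_i \quad \text{(Shannon)},

S_q(X) = \frac{1}{q-1}\Bigl(1 - \sum_i p_i^{q}\Bigr) \quad \text{(Tsallis, non-logarithmic measure)},
\qquad \lim_{q \to 1} S_q(X) = -\sum_i p_i \log p_i .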
A note on inequalities for Tsallis relative operator entropy
In this short note, we present some inequalities for the relative operator entropy which generalize some results obtained by Zou [Operator inequalities associated with Tsallis relative operator entropy, Math. Inequal. Appl. 18 (2015), no. 2, 401-406]. Meanwhile, we also show some new lower and upper bounds for the relative operator entropy and the Tsallis relative o...
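For context, the quantities involved are usually defined as follows (standard forms, not reproduced from the cited note), for positive invertible operators A, B and 0 < λ ≤ 1:

S(A \mid B) = A^{1/2}\,\log\!\bigl(A^{-1/2} B A^{-1/2}\bigr)\,A^{1/2},
\qquad
T_{\lambda}(A \mid B) = \frac{A \,\sharp_{\lambda}\, B - A}{\lambda},
\quad
A \,\sharp_{\lambda}\, B = A^{1/2}\bigl(A^{-1/2} B A^{-1/2}\bigr)^{\lambda} A^{1/2},

with T_λ(A|B) → S(A|B) as λ → 0.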
A Note on the Bivariate Maximum Entropy Modeling
Let X = (X1, X2) be a continuous random vector. Under the assumption that the marginal distributions of X1 and X2 are given, we develop models for the vector X when there is partial information about the dependence structure between X1 and X2. The models, which are obtained based on the well-known principle of maximum entropy, are called maximum entropy (ME) mo...
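In generic form, the maximum entropy problem with fixed marginals (a schematic statement of the principle invoked above, not the specific ME models of the paper) reads

\max_{f}\; -\iint f(x_1, x_2)\,\log f(x_1, x_2)\,\mathrm{d}x_1\,\mathrm{d}x_2
\quad \text{subject to} \quad
\int f(x_1, x_2)\,\mathrm{d}x_2 = f_{1}(x_1),
\qquad
\int f(x_1, x_2)\,\mathrm{d}x_1 = f_{2}(x_2),

together with whatever partial dependence information (e.g. a prescribed mixed moment) is available.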
A multi agent method for cell formation with uncertain situation, based on information theory
This paper treats the cell formation problem as a distributed decision network. It proposes an approach based on the application and extension of information-theoretic concepts in order to analyze the informational complexity in an agent-based system arising from the interdependence between agents. Based on this approach, new quantitative concepts and definitions are proposed in order to measure the amount of t...
Entropy and Sinai’s Theorem
These short notes cover the basic definitions and properties of entropy, with an emphasis on its use in dynamical systems. They are intended to show how the different notions of entropy that appear in probability and mathematical statistical physics can be modified for use in the study of measurable dynamical systems, culminating in one of the most striking such uses of entropy: the proof of Sinai...
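The dynamical notion alluded to here is, in its usual textbook form (not quoted from the notes), the Kolmogorov–Sinai entropy of a measure-preserving map T with respect to a finite measurable partition P:

h_{\mu}(T, \mathcal{P}) = \lim_{n \to \infty} \frac{1}{n}\, H_{\mu}\!\Bigl(\bigvee_{i=0}^{n-1} T^{-i}\mathcal{P}\Bigr),
\qquad
h_{\mu}(T) = \sup_{\mathcal{P}} h_{\mu}(T, \mathcal{P}),

where H_μ denotes the Shannon entropy of a finite measurable partition under the measure μ.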
Journal: Entropy
Volume: 16, Issue: -
Pages: -
Publication date: 2014